Universal and Succinct Source Coding of Deep Neural Networks

Authors

Abstract

Deep neural networks have shown incredible performance for inference tasks in a variety of applied domains, but require significant storage space, which limits scaling and use for on-device intelligence. This paper is concerned with finding universal lossless compressed representations of deep feedforward networks with synaptic weights drawn from discrete sets, and directly performing inference without full decompression. The basic insight that allows less rate than naïve approaches is recognizing that the bipartite graph layers of feedforward networks have a kind of permutation invariance to the labeling of nodes, in terms of inferential operation. We provide efficient algorithms to dissipate this irrelevant uncertainty and then use arithmetic coding to nearly achieve the entropy bound in a universal manner. We also provide experimental results of our approach on several standard datasets.
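The permutation-invariance insight from the abstract can be illustrated with a minimal sketch (the network, shapes, and names below are hypothetical, not taken from the paper): relabeling the hidden nodes of a feedforward layer, i.e. permuting the rows of the incoming weight matrix and the columns of the outgoing one identically, leaves the network's input–output function unchanged, so a compressor need not spend bits distinguishing these orderings.

```python
import numpy as np
from math import lgamma, log

rng = np.random.default_rng(0)
# Hypothetical two-layer network with discrete weights: y = W2 @ relu(W1 @ x)
W1 = rng.integers(-2, 3, size=(4, 3)).astype(float)  # hidden x input
W2 = rng.integers(-2, 3, size=(2, 4)).astype(float)  # output x hidden
x = rng.normal(size=3)

def forward(W1, W2, x):
    return W2 @ np.maximum(W1 @ x, 0.0)  # ReLU hidden layer

# Relabel the 4 hidden units: permute rows of W1 and columns of W2 identically.
perm = np.array([2, 0, 3, 1])
W1p, W2p = W1[perm, :], W2[:, perm]

# The input-output function is unchanged by the relabeling.
assert np.allclose(forward(W1, W2, x), forward(W1p, W2p, x))

# Roughly log2(n!) bits per layer are therefore redundant for a compressor
# (up to ties among identical rows); lgamma(n+1)/log(2) computes log2(n!).
n = W1.shape[0]
savings = lgamma(n + 1) / log(2)  # ≈ 4.58 bits for n = 4
```

This is only a sanity check of the invariance, not the paper's coding algorithm; the paper's contribution is canonicalizing away exactly this labeling uncertainty before entropy coding.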


Similar articles

Universal Source Coding

The material covered in this thesis falls mainly in two parts. In the first part, which is found in chapter II, the foundations of source coding are considered. A careful introduction to the notion of codes for symbols, blocks and infinite sequences is given. The classical notion of a stationary ergodic source is then introduced as a natural concept arising when assuming limiting statistical co...

A universal VAD based on jointly trained deep neural networks

In this paper, we propose a joint training approach to voice activity detection (VAD) to address the issue of performance degradation due to unseen noise conditions. Two key techniques are integrated into this deep neural network (DNN) based VAD framework. First, a regression DNN is trained to map the noisy to clean speech features similar to DNN-based speech enhancement. Second, the VAD part t...

Universal Source Coding of Finite Alphabet Sources

The application of Shannon’s rate distortion theory to real data compression practice has been limited by a lack of effective distortion measures for most real users and incomplete statistical knowledge of real sources. Universal source coding techniques attempt to overcome the latter limitation by developing codes that are effective (in the sense of approaching the theoretical rate distortion ...

Universal Deep Neural Network Compression

Compression of deep neural networks (DNNs) for memory- and computation-efficient compact feature representations becomes a critical problem, particularly for the deployment of DNNs on resource-limited platforms. In this paper, we investigate lossy compression of DNNs by weight quantization and lossless source coding for memory-efficient inference. Whereas the previous work addressed non-universal scal...

End-to-End Optimized Speech Coding with Deep Neural Networks

Modern compression algorithms are often the result of laborious domain-specific research; industry standards such as MP3, JPEG, and AMR-WB took years to develop and were largely hand-designed. We present a deep neural network model which optimizes all the steps of a wideband speech coding pipeline (compression, quantization, entropy coding, and decompression) end-to-end directly from raw speech...


Journal

Journal title: IEEE Journal on Selected Areas in Information Theory

Year: 2022

ISSN: 2641-8770

DOI: https://doi.org/10.1109/jsait.2023.3261819